Hacker News

Interesting to see that the runner-up -- Newsday -- was selected for using digital tools to expose shootings, beatings and other concealed misconduct by some Long Island police officers. This highlights the increasingly complementary role of digital tools and traditional reporting.

Anyone know what kind of digital tools they used?

Anyone know of other digital tools journalists/the press use to investigate/uncover content?




Data journalism geek here: All sorts of tools, but really, Excel and Access are adequate enough to be game-changers...this is because data has no standardized form when you get to local jurisdictions.

Jeremy Singer-Vine, formerly of the Wall Street Journal, was named a Pulitzer finalist for National Reporting this year (http://www.pulitzer.org/finalists/2014?):

> John Emshwiller and Jeremy Singer-Vine of The Wall Street Journal - For their reports and searchable database on the nation’s often overlooked factories and research centers that once produced nuclear weapons and now pose contamination risks.

HNers may remember some of Jeremy's stuff recently making the front page, including:

Reverse Engineering xkcd's Frequency: https://news.ycombinator.com/item?id=7290868

Txtbirds: https://news.ycombinator.com/item?id=4763147


> Excel and Access are adequate enough to be game-changers

Without doubting your point here, do you think access to data is driving the underlying ability to leverage these tools? Or is it just people have now started to look harder at the problems with an analytic toolkit in mind?


I think access to data is key. One of the hardest challenges is dealing -- in an empirical manner -- with paper documents, or anything that doesn't come in a spreadsheet...which, until relatively recently, was the norm of information distribution.

But I think we (as a society) are just beginning to make use of data, in terms of analysis and general computational thinking. To go back to the domain of journalism...Aron Pilhofer, who heads the interactive news team at the New York Times, said that in "one day...we can teach you the skills that if mastered would allow you to do 80 percent of all the computer-assisted reporting that has ever been done" (http://knight.stanford.edu/life-fellow/2012/times-editor-say...)...I think this is still the case.

As an example, you don't have to go back much further than last year's Public Service Pulitzer...probably my favorite winner in modern times: http://www.pulitzer.org/citation/2013-Public-Service

The reporters took a sensational story (off-duty cops caught on YouTube egregiously breaking the speed limit) and opted to do an empirical analysis. But what they were trying to find -- cops breaking the law by speeding -- was inherently non-existent in public records (because it is the cops who determine whether the law is broken). So instead, the reporters requested toll booth records, which logged each cop car as it passed through. By dividing distance by elapsed time, they demonstrated so convincingly how egregious the abuse was that Florida police departments pretty much rolled over and immediately repented.
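The arithmetic behind that analysis is simple enough to sketch in a few lines of Python. The plaza names, mileposts, and timestamps below are made up for illustration; the actual Sun Sentinel data came from SunPass transponder records:

```python
from datetime import datetime

# Hypothetical toll crossings for one transponder:
# (plaza name, milepost in miles, timestamp).
records = [
    ("Plaza A", 12.0, "2012-06-01 08:00:00"),
    ("Plaza B", 47.5, "2012-06-01 08:21:00"),
]

def implied_speed_mph(rec_a, rec_b):
    """Average speed between two toll crossings: distance / elapsed time."""
    fmt = "%Y-%m-%d %H:%M:%S"
    miles = abs(rec_b[1] - rec_a[1])
    hours = (datetime.strptime(rec_b[2], fmt)
             - datetime.strptime(rec_a[2], fmt)).total_seconds() / 3600.0
    return miles / hours

# 35.5 miles in 21 minutes works out to roughly 101 mph.
print(round(implied_speed_mph(records[0], records[1]), 1))  # → 101.4
```

Everything after the records request is elementary-school math; the hard part, as noted below, is the reporting around it.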

Besides whatever database they used to hold the data, the analysis here is literally elementary level. This is not to say that the reporters had it easy (they still had to do all the footwork, interviews, confrontations, and fact-checking, among other things), but it goes to show how many important stories are out there, in every jurisdiction, that only need someone who cares enough to do some counting and arithmetic. I kind of love the Public Service awards because of how they recognize these relatively non-sexy but incredibly important stories done by determined and clever journalists.




