Ask HN: Can you score blogs by truthfulness?
4 points by read on March 11, 2014 | 8 comments
I'm wondering if there's a way to find blogs according to a custom search algorithm.

I know Reddit and HN rank posts by votes. But choosing stories that way suffers from side effects like:

(1) a story being popular because it's so new

(2) a story being liked by a lot of people because of the way it's written

(3) a surprisingly eye-opening story saying something people aren't willing to believe, so they don't upvote it

(4) a story simply being incorrect.

I'd prefer to read stories ranked by other measures, the most important one for me being whether what they say is true.

Is there an algorithm, tool or service that scores stories by truthfulness?

edit: changed truthiness to truthfulness



This is a really interesting problem. I work for a media organization and we make our money by printing advertiser-friendly versions of stories. This primarily involves framing political or economic risks as "opportunities", omitting damaging facts/analyses, and exclusively quoting wealthy/powerful people.

Most media companies do some version of this, with newspapers having among the best reputations for getting at the truth, and advertorial supplements the worst. Still, one key factor that I believe affects the "truth" of an article is the revenue stream of the writer/publisher. If you want objective analysis of anything, you probably have to stop reading newspapers/magazines run on an advertising model, or at least be very, very selective.

There are a number of factors in play that can help determine the "truthfulness" of an article: objective facts, sources, the expertise cited, the existence of debate in the field being covered, and so on. As an editor, I think you are trying to find an automated solution to an intrinsically human problem, but maybe I'm just threatened because you're trying to outsource my job to a machine. Ultimately, I think learning how to read deeply is something that can be taught, but not set to an algorithm. At least, this appeals to my desire for job security.


>This primarily involves framing political or economic risks as "opportunities", omitting damaging facts/analyses, and exclusively quoting wealthy/powerful people.

This seems a bit disingenuous and sleazy, depending on who exactly is consuming this content.


Indeed.

The company makes money from advertising, so advertisers are its most important readers. This is true, to a greater or lesser extent, for any media that sells ad space. Some organizations maintain "Chinese walls" between the ad side and the editorial side, but these are businesses and they ultimately have to answer to their bottom lines. That's true for Fox News, the Washington Post, or a Financial Times supplement – they are all chasing money with their brand and curating content that conforms to the expectations advertisers have of said brand. Rolling Stone is an 'edgy' magazine, so they publish articles that critique America's rapacious capitalism, but they still run full-page adverts for Apple products.

So, speaking as an editor, I believe it is possible to address some of the questions about truth above with an algorithm, but not all. There isn't a substitute for reading with a skeptical eye, doing one's own research, balancing views from experts, and filtering that against the influence money has on the source's position. A computer could make some of that easier, but if someone offers me a digital product that helps me verify the truth of articles I read, for the low, low cost of only $2/month, I'm still going to doubt the veracity of what I'm getting.

I heard a story once that sort of speaks to the issues you describe. A group of American journalists visited Moscow in the late 80s to meet their Soviet counterparts. They discussed life in each other's countries and shared perspectives on working conditions in their trade. One American asked, "What was it like, being directed by the communists as to what you could and could not print?"

A Russian journalist responded, "Much the same as being directed by capitalists."


I've been thinking about this a lot with respect to user-generated content, and although this is somewhat tangential, it appears that people give the most positive feedback to the version of the truth that most resembles their own, not that of some objective third-party observer.

I had originally thought this was a modern phenomenon, but it turns out it's been going on for a long time.

Take for example Dürer's Rhinoceros: http://en.wikipedia.org/wiki/D%C3%BCrer's_Rhinoceros

His depiction is armor-clad and has an extra horn; it looks fanciful. And yet another artist, Hans Burgkmair, produced a depiction that is basically true to life. Still, Dürer's print became famous and was duplicated widely, while only one of Burgkmair's prints is extant.

Slightly more anecdotally: for a while my dad collected maps, and I bought him an original from the 1700s that depicted California as an island, even though the coast had been well mapped since the 16th century. It was something of a fanciful myth, and people preferred the "island" maps to display in their homes, while real navigators had accurate maps.

Anyway, what I'm trying to get at is: if you want to measure truthiness, you should do it relative to the observer, with some awareness that it's flexible, assuming you're thinking of this as a way to match content to people. If you just want some certificate of truth, then for the reasons above it probably won't take hold with the general population (they will dismiss it for not agreeing with their beliefs).


Collectively agreeing on what is "true" is not an easy problem. The crowdsourced fact-checking of Wikipedia is probably the closest thing to what you're looking for.


You want to rank by truthfulness and not truthiness. :)

http://en.wikipedia.org/wiki/Truthiness


I don't think it can be done. Every group has its groupthink. Facts that fall outside the scope of their collective experience and understanding of the world get dismissed as tall tales.


If you provide me the function that takes a chunk of text and returns its truth value, I'll happily finish the aggregation service for you.

As a free bonus, I'll also filter comments for you.
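
To make that joke concrete, here is a minimal sketch of the split it describes, assuming a hypothetical truth_score function (the unsolved part) and the trivial ranking the commenter offers to finish. The names, signatures, and stub are mine for illustration, not anything anyone in this thread actually has.

    # Minimal sketch of the split described above: the truth function is the
    # hard, missing piece; the aggregation around it is trivial.
    # All names here (truth_score, rank_by_truthfulness) are hypothetical.

    from typing import Callable, List, Tuple


    def truth_score(text: str) -> float:
        """The missing piece: map a chunk of text to a truthfulness score in [0, 1]."""
        raise NotImplementedError("this is the whole problem")


    def rank_by_truthfulness(
        stories: List[Tuple[str, str]],
        score: Callable[[str], float] = truth_score,
    ) -> List[Tuple[str, float]]:
        """The easy part: score each (title, text) pair and sort by score, descending."""
        scored = [(title, score(text)) for title, text in stories]
        return sorted(scored, key=lambda pair: pair[1], reverse=True)


    if __name__ == "__main__":
        stories = [
            ("A breathless post about a new framework", "..."),
            ("A dry but careful analysis", "..."),
        ]
        # Swap in a real scorer to get an actual ranking; with the stub this
        # just demonstrates where the open problem sits.
        try:
            print(rank_by_truthfulness(stories))
        except NotImplementedError as exc:
            print("still waiting on the truth function:", exc)
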



